
    Offscreen and in the chair next to you: conversational agents speaking through actual human bodies

    This paper demonstrates how to interact with a conversational agent that speaks through an actual human body, face-to-face and in person (i.e., offscreen). This is made possible by the cyranoid method: a technique in which a person speech shadows for a remote third party (i.e., receives their words via a covert audio-relay apparatus and repeats them aloud in real time). When a person shadows for an artificial conversational agent source, we call the resulting hybrid an “echoborg.” We report a study in which people encountered conversational agents either through a human shadower face-to-face or via a text interface, under conditions where they assumed their interlocutor to be an actual person. Our results show that the perception of a conversational agent is dramatically altered when the agent is voiced by an actual, tangible person. We discuss the potential implications of this methodology for the development of conversational agents and for general person perception research.
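
    The relay the cyranoid method describes can be pictured as a small loop: the interlocutor's words are passed to the agent, and the agent's reply is rendered as speech into the shadower's earpiece for immediate repetition. Below is a minimal, hypothetical sketch of such a pipeline in Python, assuming the pyttsx3 text-to-speech library and a placeholder agent_reply() function standing in for whatever conversational-agent backend is used; the abstract does not specify the study's actual apparatus, so this is illustrative only.

```python
# Hypothetical sketch of a cyranoid-style audio relay: a conversational agent's
# replies are spoken aloud so a human shadower (wearing an earpiece) can repeat
# them in real time. agent_reply() is a placeholder, not the study's chatbot.

import pyttsx3  # offline text-to-speech; plays through the default audio device


def agent_reply(utterance: str) -> str:
    """Placeholder conversational-agent backend (canned logic for illustration)."""
    return f"You said: {utterance}. Tell me more about that."


def relay_loop() -> None:
    engine = pyttsx3.init()
    engine.setProperty("rate", 160)  # slow enough for comfortable shadowing

    while True:
        heard = input("Interlocutor said (blank line to stop): ").strip()
        if not heard:
            break
        reply = agent_reply(heard)
        # In the described setup this audio would be routed covertly to the
        # shadower's earpiece; here it simply plays on the default device.
        engine.say(reply)
        engine.runAndWait()


if __name__ == "__main__":
    relay_loop()
```

    In practice the audio would be directed to a covert earpiece rather than a speaker, and the interlocutor's speech would be transcribed by an operator or speech recognizer rather than typed in by hand.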

    A complementary systems account of word learning: neural and behavioural evidence

    In this paper we present a novel theory of the cognitive and neural processes by which adults learn new spoken words. This proposal builds on neurocomputational accounts of lexical processing and spoken word recognition and complementary learning systems (CLS) models of memory. We review evidence from behavioural studies of word learning that, consistent with the CLS account, show two stages of lexical acquisition: rapid initial familiarization followed by slow lexical consolidation. These stages map broadly onto two systems involved in different aspects of word learning: (i) rapid, initial acquisition supported by medial temporal and hippocampal learning, and (ii) slower neocortical learning achieved by offline consolidation of previously acquired information. We review behavioural and neuroscientific evidence consistent with this account, including a meta-analysis of PET and functional Magnetic Resonance Imaging (fMRI) studies that contrast responses to spoken words and pseudowords. From this meta-analysis we derive predictions for the location and direction of cortical response changes following familiarization with pseudowords. This allows us to assess evidence for learning-induced changes that convert pseudoword responses into real word responses. Results provide unique support for the CLS account, since hippocampal responses change during initial learning, whereas cortical responses to pseudowords only become word-like if overnight consolidation follows initial learning.

    Neurophysiological evidence for rapid processing of verbal and gestural information in understanding communicative actions

    During everyday social interaction, gestures are a fundamental part of human communication. The communicative pragmatic role of hand gestures and their interaction with spoken language has been documented at the earliest stage of language development, in which two types of indexical gestures are most prominent: the pointing gesture for directing attention to objects and the give-me gesture for making requests. Here we study, in adult human participants, the neurophysiological signatures of gestural-linguistic acts of communicating the pragmatic intentions of naming and requesting by simultaneously presenting written words and gestures. Already at ~150 ms, brain responses diverged between naming and request actions expressed by word-gesture combinations, whereas the same gestures presented in isolation elicited their earliest neurophysiological dissociations significantly later (at ~210 ms). There was an early enhancement of request-evoked brain activity as compared with naming, which was due to sources in the frontocentral cortex, consistent with access to action knowledge in request understanding. In addition, an enhanced N400-like response indicated late semantic integration of gesture-language interaction. The present study demonstrates that word-gesture combinations used to express communicative pragmatic intentions speed up the brain correlates of comprehension processes, compared with gesture-only understanding, thereby calling into question current serial linguistic models that place pragmatic function decoding at the end of a language comprehension cascade. Instead, information about the social-interactive role of communicative acts is processed instantaneously.